Subsampling the Gibbs Sampler: Variance Reduction

Authors

  • Steven N. MacEachern
  • Mario Peruggia
Abstract

Subsampling the output of a Gibbs sampler in a non-systematic fashion can improve the efficiency of marginal estimators if the subsampling strategy is tied to the actual updates made. We illustrate this point by example, approximation, and asymptotics. The results hold both for random scan and fixed scan Gibbs samplers.
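The idea of tying the subsample to the updates can be illustrated with a toy random-scan Gibbs sampler (a minimal Python sketch, not the authors' construction; the bivariate-normal target and the update-tied retention rule are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.9       # correlation of the bivariate normal target (illustrative choice)
n_iter = 5000

def random_scan_gibbs(n_iter):
    """Random-scan Gibbs sampler for a standard bivariate normal with
    correlation rho; records which coordinate was updated at each step."""
    x = np.zeros(2)
    chain = np.empty((n_iter, 2))
    updated = np.empty(n_iter, dtype=int)
    cond_sd = np.sqrt(1.0 - rho**2)
    for t in range(n_iter):
        i = rng.integers(2)                      # coordinate picked at random
        x[i] = rho * x[1 - i] + cond_sd * rng.standard_normal()
        chain[t] = x
        updated[t] = i
    return chain, updated

chain, updated = random_scan_gibbs(n_iter)

# Marginal estimator of E[X_0] from every iterate, versus one computed only
# from the iterates at which coordinate 0 was actually refreshed -- a
# subsample tied to the updates the sampler made, not a systematic thinning.
est_all = chain[:, 0].mean()
est_update_tied = chain[updated == 0, 0].mean()
```

Both averages are consistent for the marginal mean; the point of comparison is their variance, which depends on how the retained iterates relate to the sampler's actual update pattern.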


Similar articles

Antithetic Coupling of Two Gibbs Sampler Chains (Norges Teknisk-naturvitenskapelige Universitet)

SUMMARY: Two coupled Gibbs sampler chains, both with the same invariant probability density, are run in parallel in such a way that the chains are negatively correlated. This allows us to define an asymptotically unbiased estimator of the expectation E(f(X)) which achieves significant variance reduction with respect to the usual Gibbs sampler at comparable computational cost. We show t...
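The coupling described can be sketched for a bivariate normal target (a toy Python illustration under assumed inverse-CDF conditional updates; the Gaussian target and f(x) = x_0 are choices made for the example, not taken from the paper):

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(1)
rho, n_iter = 0.9, 4000            # illustrative correlation and chain length
inv_cdf = NormalDist().inv_cdf     # standard normal quantile function
cond_sd = np.sqrt(1.0 - rho**2)

x = np.zeros(2)                    # first chain
y = np.zeros(2)                    # antithetic twin
fx = np.empty(n_iter)
fy = np.empty(n_iter)
for t in range(n_iter):
    for i in (0, 1):               # deterministic sweep over the coordinates
        u = rng.uniform()
        # Both chains are driven by the same uniform; the twin uses 1 - u,
        # so each conditional draw is the reflection of its partner.
        x[i] = rho * x[1 - i] + cond_sd * inv_cdf(u)
        y[i] = rho * y[1 - i] + cond_sd * inv_cdf(1.0 - u)
    fx[t], fy[t] = x[0], y[0]

# Average the two negatively correlated chains. For a linear functional of a
# Gaussian target the coupling is perfectly antithetic, so the variance
# reduction in this toy case is (atypically) total.
est_antithetic = 0.5 * (fx.mean() + fy.mean())
est_single = fx.mean()
```

For general targets and functionals the two chains are only partially negatively correlated, and the gain over a single chain is the quantity the paper analyzes.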



Markov Interacting Importance Samplers (25 Jun 2015)

We introduce a new Markov chain Monte Carlo (MCMC) sampler called the Markov Interacting Importance Sampler (MIIS). The MIIS sampler uses conditional importance sampling (IS) approximations to jointly sample the current state of the Markov Chain and estimate conditional expectations, possibly by incorporating a full range of variance reduction techniques. We compute Rao-Blackwellized estimates ...
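Rao-Blackwellization, the variance reduction device mentioned here, is easy to demonstrate on a toy Gibbs chain (a Python sketch unrelated to the MIIS implementation; the bivariate-normal target is an assumption made for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
rho, n_iter = 0.9, 4000            # illustrative correlation and chain length
cond_sd = np.sqrt(1.0 - rho**2)

x = np.zeros(2)
draws = np.empty(n_iter)           # raw draws of X_0
cond_means = np.empty(n_iter)      # E[X_0 | X_1], the Rao-Blackwellized terms
for t in range(n_iter):
    for i in (0, 1):               # deterministic sweep
        x[i] = rho * x[1 - i] + cond_sd * rng.standard_normal()
    draws[t] = x[0]
    cond_means[t] = rho * x[1]     # conditional mean of X_0 given current X_1

est_empirical = draws.mean()           # the usual ergodic average
est_rao_blackwell = cond_means.mean()  # averages conditional expectations
```

Both estimators target the same marginal mean; the Rao-Blackwellized version replaces each draw by its conditional expectation, which typically (though not universally, for dependent chains) reduces variance.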


Outperforming the Gibbs Sampler Empirical Estimator for Nearest Neighbor Random Fields

Given a Markov chain sampling scheme, does the standard empirical estimator make best use of the data? We show that this is not so and construct better estimators. We restrict attention to nearest neighbor random fields and to Gibbs samplers with deterministic sweep, but our approach applies to any sampler that uses reversible variable-at-a-time updating with deterministic sweep. The structure of...


Efficient Training of LDA on a GPU by Mean-for-Mode Estimation

We introduce Mean-for-Mode estimation, a variant of an uncollapsed Gibbs sampler that we use to train LDA on a GPU. The algorithm combines benefits of both uncollapsed and collapsed Gibbs samplers. Like a collapsed Gibbs sampler — and unlike an uncollapsed Gibbs sampler — it has good statistical performance, and can use sampling complexity reduction techniques such as sparsity. Meanwhile, like ...



Journal title:

Volume   Issue

Pages  -

Publication date: 1999